Chapter 28 Tail Inequalities

Author

  • David Hilbert
Abstract

Definition 28.1.2. The variance of a random variable X with expectation μ = E[X] is the quantity V[X] = E[(X − μ)²] = E[X²] − μ². The standard deviation of X is σX = √V[X]. Theorem 28.1.3 (Chebychev inequality). Let X be a random variable with μX = E[X], and let σX be the standard deviation of X; that is, σX² = E[(X − μX)²]. Then Pr[|X − μX| ≥ tσX] ≤ 1/t². This work is licensed under the Creative Commons Attribution-Noncommercial 3.0 License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.
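The Chebychev bound above can be compared against an empirical tail probability. A minimal sketch in Python, assuming (arbitrarily, for illustration) that X follows a standard exponential distribution, for which μ = σ = 1:

```python
import random

# Compare the empirical tail Pr[|X - mu| >= t*sigma] against Chebychev's
# bound 1/t^2, using X ~ Exponential(1), which has mu = 1 and sigma = 1.
random.seed(0)
mu, sigma = 1.0, 1.0
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

for t in (2.0, 3.0, 4.0):
    tail = sum(abs(x - mu) >= t * sigma for x in samples) / n
    bound = 1.0 / t**2
    print(f"t={t}: empirical tail {tail:.4f} <= bound {bound:.4f}")
```

For the exponential distribution the true tails (about e⁻³, e⁻⁴, e⁻⁵) are far below Chebychev's bound, which illustrates that the inequality is distribution-free but often loose.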

Similar sources

Some Probability Inequalities for Quadratic Forms of Negatively Dependent Subgaussian Random Variables

In this paper, we obtain upper exponential bounds for the tail probabilities of quadratic forms of negatively dependent subgaussian random variables. In particular, the law of the iterated logarithm for quadratic forms of independent subgaussian random variables is generalized to the case of negatively dependent subgaussian random variables.


Program Committee Local Committee Special Session Organizers Wednesday 28

Inversion of perturbed linear operators that are singular at the origin. 17:00–17:25, Alex Rubinov: Abstract Convexity and Hermite-Hadamard-type Inequalities. Long memory and heavy tails in stochastic modeling with application to finance.


Tail inequalities for sums of random matrices that depend on the intrinsic dimension

This work provides exponential tail inequalities for sums of random matrices that depend only on intrinsic dimensions rather than explicit matrix dimensions. These tail inequalities are similar to the matrix versions of the Chernoff bound and Bernstein inequality except with the explicit matrix dimensions replaced by a trace quantity that can be small even when the explicit dimensions are large...


Inequalities between hypergeometric tails

A special inequality between the tail probabilities of certain related hypergeometrics was shown by Seneta and Phipps [19] to suggest useful ‘quasi-exact’ alternatives to Fisher’s [5] Exact Test. With this result as motivation, two inequalities of Hájek and Havránek [6] are investigated in this paper and are generalised to produce inequalities in the form required. A parallel inequality in bino...


Chapter 26 Tail Inequalities

Theorem 26.1.2 (Chebychev inequality). Let X be a random variable with μX = E[X], and let σX be the standard deviation of X; that is, σX² = E[(X − μX)²]. Then Pr[|X − μX| ≥ tσX] ≤ 1/t². Proof: Note that Pr[|X − μX| ≥ tσX] = Pr[(X − μX)² ≥ t²σX²]. Set Y = (X − μX)². Clearly, E[Y] = σX². Now, apply Markov's inequality to Y.
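The key step of the proof is Markov's inequality: for a nonnegative random variable Y and any a > 0, Pr[Y ≥ a] ≤ E[Y]/a. A minimal numerical sketch in Python, assuming (arbitrarily, for illustration) X uniform on (0, 1), so Y = (X − μ)² has E[Y] = Var[X] = 1/12:

```python
import random

# Markov's inequality: for nonnegative Y and a > 0, Pr[Y >= a] <= E[Y]/a.
# With Y = (X - mu)^2 and a = t^2 * sigma^2 this yields Chebychev's
# inequality, exactly as in the proof above.
random.seed(1)
n = 100_000
mu = 0.5  # mean of Uniform(0, 1)
ys = [(random.random() - mu) ** 2 for _ in range(n)]  # Y = (X - mu)^2 >= 0
mean_y = sum(ys) / n  # approximates E[Y] = Var[X] = 1/12

for a in (0.1, 0.2, 0.25):
    tail = sum(y >= a for y in ys) / n
    print(f"a={a}: Pr[Y >= a] ~ {tail:.4f} <= E[Y]/a ~ {mean_y / a:.4f}")
```

Note that Markov's inequality holds for the empirical distribution as well, so the printed tail never exceeds the printed bound even before the sample averages converge.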


Publication date: 2010